Patent Abstract:
Selectively irradiating the slide 16 with a first light source 42 oriented obliquely to the surface 12 of the slide to obtain a first image of the slide 16 irradiated by the first light source 42; selectively irradiating the slide 16 with a second light source 46 that provides generally diffuse light to obtain a second image of the slide 16 irradiated by the second light source 46; and generating a map of regions of interest based on the first image and the second image.
Publication number: KR20000064473A
Application number: KR1019980704655
Filing date: 1996-12-13
Publication date: 2000-11-06
Inventors: Eran Kaplan; Ofer Shapira; Yuval Harari; Daniel Hacknochi; Richard S. F. Scott
Applicant: Mark R. Rutenberg; Neuromedical Systems, Incorporated
Primary IPC class:
Patent Description:

Boundary Mapping System and Method
In the medical industry, specimens are often attached to slides so that various testing and sorting functions can be performed under a microscope. In pathological analysis, specimens such as tissue sections, fluids, and smears from various parts of the body are placed on slides and covered with transparent coverslips or cover glasses having the optical properties required for microscopy. The coverslip may serve to attach the specimen to the slide, act as a protective layer for the specimen, or both. Unfortunately, it is difficult to accurately position the coverslip in the correct location on the slide. Moreover, air can become trapped between the slide or specimen and the coverslip, forming undesirable inclusions or bubbles that interfere with viewing the specimen.
One pathological analysis using slides is the Pap smear test. In the Pap smear test, a sample of cellular material is applied to a slide, stained, and then covered with a glass or plastic coverslip. The Pap smear is then analyzed using manual or automated microscopy to determine the presence in the specimen of particular cells, such as malignant or premalignant cells.
In particular, when automatically or semi-automatically classifying a sample, such as a Pap smear sample on a slide, it is desirable to generate a map of the regions of interest on the slide over which the classification should be performed. As an example, it is advantageous to inform the classification system of the boundary of the coverslip so that the classification function is limited to the slide area containing the material to be classified. It is also desirable to inform the system of the locations of bubbles or similar inclusions so that these areas are excluded from the analysis. Not only does this allow for improved accuracy in certain tests, it can also reduce the processing time required for the analysis.
At present, technicians manually map slides by digitizing the areas of the slide occupied by undesirable bubbles and by the coverslip boundaries, so that these areas are not considered in the evaluation by the processor or the cytotechnologist. The operator uses a digitizing pen to draw a line around undesirable areas of the slide (i.e., bubbles, air inclusions, scratches, and coverslip boundaries). This method of manually mapping samples is an effective way to prepare slides for automated analysis; however, current manual methods are time consuming and expensive. It is therefore desirable to be able to map the boundaries of the specimen automatically.
Summary of the Invention
The present invention provides an automated boundary mapping system and method. The system uses a pair of light banks that direct diffuse or obliquely incident light onto the slide to highlight the air bubbles trapped under the coverslip and to make the boundaries of the coverslip detectable. The camera captures images of the specimen and slide, and a processing system then generates a boundary map of the areas within the coverslip boundaries that are not obscured by bubbles or the like. The system also displays the mapping results, allowing the user to modify the mapping process.
According to one embodiment of the invention, a method of mapping regions of a slide comprises the steps of selectively irradiating the slide with a first light source oriented generally obliquely to the surface of the slide, obtaining a first image of the slide irradiated by the first light source, selectively irradiating the slide with a second light source that provides generally diffuse light, obtaining a second image of the slide irradiated by the second light source, and generating a map of regions of interest based on the first image and the second image.
According to another embodiment of the present invention, a slide mapping system comprises a first light source oriented obliquely to the slide surface for producing a first image, a second light source for providing diffuse light to the slide surface for producing a second image, a camera for obtaining the first image and the second image, and a processor for generating a map of regions of interest based on the first image and the second image.
According to another embodiment of the present invention, a method of reviewing sample mapping information includes generating a pixel contrast map of a sample, determining locations of interest in the sample, assigning odd or even numbers to pixels within the locations of interest (where the number assigned to each pixel represents the contrast of that pixel), assigning the other of odd or even numbers to the other pixels (where the number assigned to each pixel again represents the contrast of that pixel), displaying the pixels so that the odd-numbered pixels appear with different color characteristics than the even-numbered pixels, and allowing the operator to change the pixel contrast map.
According to another embodiment of the invention, a method of detecting the positions of bubbles in a slide comprises obtaining a first image of the slide irradiated under a first irradiation condition, obtaining a second image of the slide irradiated under a second irradiation condition, finding boundary lines in the first image and the second image and combining the boundary lines to form a third image, finding the bounded regions defined by the boundary lines in the third image, calculating a mean gray scale contrast for each region in the second image corresponding to a bounded region in the third image, and comparing the mean value calculated for each region with a threshold based on the gray scale contrast of the corresponding region in the first image.
According to another embodiment of the present invention, a method for finding a line in an image formed of a plurality of rows and columns of pixels includes summing the contrast value of each of a plurality of pixels in a row with the contrast values of the preceding pixels in the row and storing the sum for each of the pixels, comparing the sums stored for a plurality of pixels in the same column with a threshold, and estimating a point on the line as a function of the pixels whose stored sums exceed the threshold.
According to another embodiment of the present invention, a method for finding a line in an image formed of rows and columns of a plurality of pixels includes summing the contrast value of each of a plurality of pixels in a row with the contrast values of the preceding pixels in the row and storing the sum for each of the pixels, comparing the stored sums for a plurality of pixels in the same column with a threshold, estimating a first point on the line as a function of the pixels whose stored sums exceed the threshold, dividing the image into a plurality of subimages, obtaining, for a plurality of rows adjacent to the estimated point, the sums of the contrast values of the pixels in a row of each subimage, comparing the obtained sums with the threshold, and estimating the positions of additional points on the line as a function of the pixels whose obtained sums exceed the threshold.
According to another embodiment of the present invention, a method of reviewing mapping information for an image includes displaying a first map of the image having regions of interest distinguished from other regions in the image, allowing an operator to change the regions of interest on the display, and generating a second map in accordance with the first map and any changes made by the operator.
These and other features of the invention are fully described below and particularly pointed out in the claims; the following description and the accompanying drawings set forth in detail certain exemplary embodiments of the invention. These embodiments, however, represent but a few of the various ways in which the principles of the invention may be employed.
FIELD OF THE INVENTION
The present invention generally relates to systems and methods for obtaining an image and detecting boundaries in the image, and more particularly to systems and methods for mapping the boundaries of a specimen. Preferably, the invention relates to a system and method for mapping regions of interest on a slide, such as regions containing specimen material within a coverslip.
1 is a schematic diagram of a boundary mapping system according to the present invention and an automatic classification system that uses the boundary map information generated by the boundary mapping system;
2 is a schematic diagram of the optical components of the boundary mapping system, configured to produce an image in which boundary information is highlighted;
3 is a schematic diagram of the optical components of the boundary mapping system, configured to produce an image in which bubble or inclusion information is highlighted;
4 is a schematic diagram of an exemplary optical path showing the formation of an image with boundary information highlighted;
5 is a schematic diagram of an exemplary optical path showing the formation of an image with bubble and inclusion information highlighted; and
6 is a schematic representation of the recursive subdivision technique of the present invention for finding coverslip boundaries in an image.
Referring now to the figures, FIG. 1 shows an automatic boundary mapping system 10 of the present invention for providing boundary or mapping information to, for example, an exemplary cell classification system 12. The mapping system 10 includes a stage 14 on which the slide 16 to be mapped is positioned, a camera 18, light banks 20 and 22, a diffuser 24, and a processing system 26 for developing a boundary map. The mapping system 10 also includes a robot slide handler 28 for transferring slides between a storage box 30 and the stage 14, a barcode reader 32 for reading barcoded information from the slide 16, and a display device 34 for facilitating operator interaction and allowing review and modification of the mapping process.
As noted above, the slide mapping system 10 is particularly useful for providing an automatic or semi-automatic sample classifier with information regarding the position of the specimen on the slide. In that context, the slide mapping information can be used by the sample classifier to reduce classification time by confining the classification function to areas where there is a high likelihood of biological sample being present on the slide. Moreover, by providing a map of the specimen on the slide, precise positioning of the specimen and the coverslip on the slide is not required, and the sample classifier can also be used with specimen slides covered by coverslips of various shapes and sizes. A number of exemplary sample classification systems to which the mapping system 10 can provide mapping information are described in US Pat. Nos. 5,287,272, 5,257,182, and 4,965,725, and in US Patent Applications 07/425,665, 07/502,611, and 08/196,982, the entire disclosures of which are incorporated herein by reference. One commercial sample classification system is produced under the trademark PAPNET by Neuromedical Systems, Inc. of Suffern, New York. The mapping device of the present invention, however, has a wide range of potential applications and is not limited to use with specimen classifiers or with slides and coverslips; these are merely exemplary vehicles for describing the mapping system of the present invention.
The light banks 20 and 22 and the diffuser 24 cooperate to produce various irradiation conditions incident on the slide. Each condition is tailored to highlight specific optical features and characteristics of the slide, thus facilitating the detection of features such as inclusions, bubbles, and coverslip boundaries. Selective use of the light banks 20 and 22 and the diffuser 24 produces, for viewing by the camera 18, independent images of the slide: a first image with boundary information highlighted (referred to herein as a boundary image) and a second image with bubbles and similar inclusions in the specimen highlighted (referred to herein as a bubble image).
The boundary image is obtained by irradiating the slide 16 with the inclined light bank 20, which directs light toward the slide at an angle substantially parallel to the top surface of the slide. The diffuser 24 is not used to obtain the boundary image and is therefore rotated or moved out of the field of view of the camera 18. Light incident from the inclined light bank 20 onto the boundary of the coverslip 35 is scattered by the boundary and is more likely to be directed toward the camera 18 than light incident on the top surface of the slide 16 or the coverslip, as explained in more detail below. In the boundary image formed from the light captured by the camera 18, the coverslip boundary thus appears brighter than the rest of the image. The boundary image is passed to the processing system 26, which locates the boundary lines in the image.
The bubble image is obtained by inserting the diffuser 24 into the field of view of the camera 18 adjacent to the slide 16 and irradiating the diffuser with light from the overhead light bank 22 disposed above the slide and the diffuser. The diffuser 24 distributes the light so that it is incident on the slide 16 at various angles. Due to the differences in the refractive indices of the slide 16, the coverslip 35, the specimen, and the bubbles, light entering the bubbles tends to be reflected toward the camera 18 more than light entering the specimen, as explained in more detail below. Thus, the bubbles in the resulting bubble image appear brighter than other information in that image. The bubble image is passed to the processing system 26 so that the bubble boundaries can be detected in the image. The bubble image may also include information about the locations of scratches on the coverslip or of inclusions in the specimen, all of which are collectively referred to herein as bubbles.
Based on the detected coverslip boundaries and bubbles, the processing system 26 generates a boundary map representing the area of the specimen within the coverslip, excluding bubbles. This boundary map is recorded, in association with identifying information for the slide 16 read by the barcode reader 32, for use by the automatic classification system 12. A series of boundary maps may be stored on a recording medium, such as a magnetic or optical disk, for each slide 16 in the storage box 30, or may be transferred electronically to the automatic classification system 12, such as via a communication network. The classification system 12 may then use the boundary map to help classify the sample, for example by confining the analysis to areas of the sample that are not obscured by bubbles, inclusions, scratches, and the like within the coverslip.
Referring next to FIG. 2, the optical components of the boundary mapping system 10 for producing boundary and bubble images are shown in greater detail. The boundary mapping system 10 includes a back plate 38 and a base 36 on which the various optical components of the mapping system are mounted. At the center of the base 36 is the stage 14, on which a slide 16 to be mapped is held. The stage 14 may be formed directly on the base 36 to facilitate the automatic placement or removal of the slide on the stage, such as through the blocking 40, or may be an independent element mounted on the base. The stage 14 preferably includes a positioning device (not shown) for firmly securing the slide in a known position, the same position in which the slide is held in the system to which the boundary mapping system 10 provides mapping information, such as the exemplary cell classification system 12. Suitable positioning devices are disclosed in co-pending US patent application Ser. No. 08/498,321, which is incorporated herein by reference.
An inclined light bank 20 is also mounted to the base 36 and is oriented to project light onto substantially the complete boundary of the slide 16. The inclined light bank 20 preferably includes four independent light sources 42, one located adjacent each side of the slide 16 and slightly higher than the slide, to direct light toward the slide at an oblique angle nearly parallel to the top surface of the slide. Each light source 42 may comprise an array 43 of LEDs or other suitable means for producing light.
Mounted to the back plate 38 are the camera 18, the overhead light bank 22, and a diffuser assembly 44 for selectively positioning the diffuser 24 within the field of view of the camera. The camera 18 is located above the slide 16 at a distance and viewing angle that bring all of the relevant area of the slide into view, such as the portion of the slide that may include the coverslip and specimen. The camera 18 may be any of several conventional cameras, such as a CCD camera, that alone or in combination with other components, such as an A/D converter, can produce a digital output of sufficient resolution to allow processing of the captured image, for example an image having a resolution of 640 × 480 pixels.
The overhead light bank 22 is located between the slide 16 and the camera 18 and includes two independent light sources 46 spaced around the camera's optical path so as not to obstruct the camera's field of view of the relevant area of the slide. Although other suitable light sources may be used, each overhead light source 46 is preferably an array 48 of LEDs. The diffuser assembly 44 is positioned between the slide 16 and the overhead light bank 22 and can selectively position the diffuser 24 in the optical path of the camera 18. Light emitted from the overhead light bank 22 is thus diffused onto the slide by the diffuser 24, and light reflected from the slide is diffused again, part of it toward the camera 18. The diffuser 24 includes a light dispersing element 50, such as a mylar sheet, that disperses incident light, and a frame 52 that supports the light dispersing element. The diffuser assembly 44 includes an actuating device (not shown) that selectively positions the diffuser 24 either slightly above the slide 16 and in the optical path of the camera 18, as shown in FIG. 2, when bubble images are obtained, or outside the optical path of the camera, such as adjacent the back plate 38, as shown in FIG. 3, when boundary images are obtained.
The states of the light banks 20 and 22 (i.e., whether each light bank is generating light), the position of the diffuser 24 relative to the optical path of the camera 18, and the operation of the camera, including the commands to obtain an image of the slide 16, are controlled by the processing system 26 (FIG. 1). The processing system 26 includes a suitable microcomputer with suitable interfaces for controlling the light banks 20 and 22, the diffuser assembly 44, the camera 18, the robot slide handler 28, and the barcode reader 32, as well as for receiving image data from the camera and slide identification information from the barcode reader.
In operation, once the slide 16 is positioned on the stage 14 and the diffuser 24 is rotated out of the optical path of the camera 18, the processing system 26 activates the light sources 42 of the inclined light bank 20 to illuminate the slide. As shown schematically for one boundary of the coverslip in FIG. 4, light from the light sources 42, which strikes the slide 16 at angles nearly parallel to its surface, is scattered back toward the camera 18 mainly by the coverslip boundary alone. Light incident on the upper surface 54 of the coverslip 35 or the upper surface of the slide 16 at the various oblique angles, represented by the rays indicated by arrows 58, is mainly reflected at the angle of incidence and therefore does not enter the camera 18. Since the coverslip boundary is relatively rough and tends to disperse light, light striking the coverslip boundary, represented by the ray indicated by arrow 60, is scattered, and part of it, represented by the ray indicated by arrow 62, is dispersed in the direction of the camera 18. Once the slide is illuminated by the inclined light bank 20, the processing system 26 commands the camera 18 to capture an image of the illuminated slide 16. Because mainly light scattered from the boundaries of the coverslip 35 or slide 16 enters the camera, the boundaries appear brighter than other areas of the slide and coverslip in the resulting boundary image. In some cases, for example when there are oxidized cells in a bubble under the coverslip 35, the obliquely incident light can also be scattered and reflected toward the camera, allowing such an occurrence to be detected from the boundary image.
The processing system 26 then deactivates the inclined light bank 20 and instructs the diffuser assembly 44 to rotate the diffuser 24 into the optical path of the camera 18. The processing system 26 then activates the light sources 46 of the overhead light bank 22, so that the light generated by the overhead light bank 22 irradiates the slide 16 and the coverslip 35 through the diffuser 24, as shown in FIG. 5. (In FIG. 5, note that the spacing between the diffuser 24 and the slide 16 is exaggerated for illustrative purposes; the diffuser is preferably located close enough to the slide that the lateral offset of the slide image viewed through the diffuser is not significant.) The diffuser 24 distributes the light so that it is incident on the slide 16 and the coverslip 35 at various angles. By way of example, considering ray 64, the light from the overhead light bank 22 is scattered by the diffuser 24 in various directions, including rays 66 and 68. Exemplary rays 66 and 68 are partially transmitted through the coverslip 35, and portions 70 are reflected at the air-coverslip boundary 72. The portions of rays 66 and 68 transmitted into the coverslip 35 may then be reflected at the boundary 74 between the coverslip and the region 76 sandwiched between the coverslip and the slide 16, containing specimen 78 or bubble 80, or may be transmitted into the region 76, depending on the angle at which the rays approach the boundary 74 and on the difference between the refractive indices of the coverslip and the bubble or specimen. If there is a large difference in refractive index between the two media, as in the case of glass and air, more light is reflected at the boundary between the media than when the media have similar refractive indices, as in the case of glass and the specimen or other regions not containing bubbles.
Thus, since the difference in refractive index between the coverslip 35 and the specimen 78 is relatively small, only a very small portion of ray 66 is reflected at the boundary; most of it is transmitted through the boundary and largely absorbed by the specimen 78. In the region of the boundary 74 between the coverslip 35 and a bubble 80, on the other hand, the difference in refractive index is large, so most of the light incident on the bubble 80, such as ray 68, is reflected back toward the diffuser 24, as represented by ray 82. Moreover, most of the light transmitted into the bubble 80 through the boundary 74 is reflected at the boundary between the bubble and the slide 16, further increasing the reflection due to the bubble. Rays 82 reflected by the bubble 80 are scattered again by the diffuser 24, as represented by ray 84, and a portion of that light enters the camera 18, making the bubble appear brighter than the specimen 78. The processing system 26 instructs the camera 18 to capture the bubble image of the irradiated slide 16 and to deliver that image to the processing system. The processing system 26 then returns the slide 16 to the storage box 30.
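The refractive-index reasoning above can be made concrete with the normal-incidence Fresnel reflectance, R = ((n1 - n2)/(n1 + n2))^2, a standard optics formula. The following sketch is illustrative only; the index values are assumptions, not values from this disclosure:

```python
def fresnel_reflectance(n1, n2):
    # Fraction of light reflected at normal incidence at an n1/n2 boundary.
    return ((n1 - n2) / (n1 + n2)) ** 2

# Assumed indices: glass coverslip ~1.5, air bubble 1.0, wet specimen ~1.4.
print(fresnel_reflectance(1.5, 1.0))  # glass/air: ~0.040, i.e. about 4% reflected
print(fresnel_reflectance(1.5, 1.4))  # glass/specimen: ~0.0012, about 0.1%
```

The resulting factor of roughly thirty in reflectance is what makes the bubbles stand out brightly in the bubble image.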
Once the two images have been transferred to the processing system 26, they are analyzed to detect the coverslip boundaries and the bubbles within those boundaries, as described in more detail below, and a boundary map of the areas of the slide 16 likely to contain sample is generated. A boundary map is a list of the locations or identities of pixels in an image corresponding to areas within the coverslip boundary that are not obscured by bubbles, i.e., areas likely to contain sample. The boundary map is then correlated with information identifying the slide associated with it, such as the information supplied by the barcode reader 32, and stored for use by a system such as the automatic classification system 12, which can limit its analysis to only those pixels indicated by the map as possibly containing sample.
Prior to delivery of the boundary map to the classification system 12, the operator is preferably given an opportunity to review the sample boundary map so that the map can be verified as correct, corrected, or rejected.
The processing system 26 generates the boundary map by forming a coverslip mask and a bubble mask and then logically combining the two to identify the regions within the coverslip boundary that do not contain bubbles. The coverslip mask is formed from the coverslip boundaries detected in the boundary image. The processing system 26 first removes artifacts from the boundary image by subtracting a reference image from the raw boundary image. The reference image is obtained by capturing an image of the stage 14 with the slide removed, using the boundary illumination used to obtain the boundary image. A contrast offset is added to the raw boundary image (to rule out the possibility of negative contrasts occurring after the reference image is subtracted), and the reference image is subtracted from the raw image to remove any artifacts, such as a scratch on the stage, that would otherwise appear in the boundary image. Next, a number of preprocessing operations are performed on the subtracted boundary image to thin the boundary lines in the image and to filter out short and curved lines. Using the filtered image, the long, straight boundary line that lies within a given window at each of the left, right, top, and bottom of the slide and is closest to the center of the image is detected. By detecting the straight line closer to the center of the slide, the coverslip boundary is distinguished from the slide boundary.
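A minimal sketch of the offset-and-subtract step just described, assuming 8-bit images and an illustrative offset value (the function name is ours, not from the disclosure):

```python
import numpy as np

def subtract_reference(raw, reference, offset=64):
    # Add an offset so no pixel can go negative after subtraction, then
    # subtract the empty-stage reference image to cancel fixed artifacts
    # such as a scratch on the stage.
    shifted = raw.astype(np.int32) + offset
    cleaned = shifted - reference.astype(np.int32)
    return np.clip(cleaned, 0, 255).astype(np.uint8)
```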
The coverslip boundary in each window is detected using a recursive dividing projection algorithm, which finds an initial point close to the center of the coverslip boundary and uses that point as a starting point for subsequent bisections of the image. The image is bisected repeatedly, using the point detected in the previous dividing step, until the algorithm converges on a series of points on small separate pieces of the coverslip boundary that represent the coverslip boundary relatively accurately.
For the sake of explanation, consider FIG. 6 together with the raw and cumulative projection pixel images shown below. (To facilitate explanation and summation in the cumulative projection image, the raw image is shown with a gray scale contrast of 1 or 0 for each pixel instead of actual gray scale intensities, but it is otherwise consistent with the filtered boundary image; the presented image is also only a partial depiction of the many pixels of the window image.) The raw image shows the pixel contrasts for a window containing one horizontal coverslip boundary and the region where the slide boundary lies. In this example, the top of the raw image is toward the outer edge of the boundary image, and the bottom of the raw image contains the pixels closer to the center of the boundary image. The cumulative projection image is formed by continuously summing, from left to right, the contrast values of all the pixels in the same row of the raw image (i.e., at the same distance from the image boundary), with the rightmost column of pixels forming a one-dimensional profile. As an example, the pixels in the first nine columns of the first row of the raw image all have a contrast of zero, so the corresponding pixels of the cumulative projection image are also all zero. As the pixels in the tenth and eleventh columns each have a contrast of one, the pixel in the tenth column of the cumulative projection image is 1, i.e., the sum of the contrasts of the nine preceding pixels in the row and the pixel in the tenth column, and the value for the pixel in the eleventh column is the sum for the preceding ten pixels (i.e., 1) plus the contrast of the eleventh pixel in the same row of the raw image, giving a sum of 2. Since the contrasts of the pixels in columns 12 and 13 of the raw image are both zero, the corresponding pixels in the cumulative projection image also have the value 2, the sum of the contrasts of all the pixels preceding them in the same row of the raw image. The same summation is performed for rows 2 through 8, and the rightmost column of pixels, shown below, forms the one-dimensional profile of the cumulative projection image, with sums of 0, 3, 4, 5, 2, 0, 4, 2 from the bottom up.
Raw boundary image:
0 0 0 0 0 0 0 0 0 1 1 0 0
1 0 0 0 0 0 0 1 1 1 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 1 1 0 0 0 0
0 0 0 0 0 0 0 0 1 1 1 1 1
0 0 0 0 1 1 1 1 0 0 0 0 0
1 1 1 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0
Cumulative projection image:
0 0 0 0 0 0 0 0 0 1 2 2 2
1 1 1 1 1 1 1 2 3 4 4 4 4
0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 1 2 2 2 2 2
0 0 0 0 0 0 0 0 1 2 3 4 5
0 0 0 0 1 2 3 4 4 4 4 4 4
1 2 3 3 3 3 3 3 3 3 3 3 3
0 0 0 0 0 0 0 0 0 0 0 0 0
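The cumulative projection and its one-dimensional profile can be computed as in the following sketch, which reproduces the example above (the use of numpy is an implementation choice, not part of the disclosure):

```python
import numpy as np

raw = np.array([
    [0,0,0,0,0,0,0,0,0,1,1,0,0],
    [1,0,0,0,0,0,0,1,1,1,0,0,0],
    [0,0,0,0,0,0,0,0,0,0,0,0,0],
    [0,0,0,0,0,0,0,1,1,0,0,0,0],
    [0,0,0,0,0,0,0,0,1,1,1,1,1],
    [0,0,0,0,1,1,1,1,0,0,0,0,0],
    [1,1,1,0,0,0,0,0,0,0,0,0,0],
    [0,0,0,0,0,0,0,0,0,0,0,0,0],
])

cumulative = np.cumsum(raw, axis=1)  # running left-to-right sum in each row
profile = cumulative[:, -1]          # rightmost column: the 1-D profile
print(profile)                       # [2 4 0 2 5 4 3 0], i.e. 0,3,4,5,2,0,4,2 bottom-up
```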
To find the coverslip boundary (e.g., boundary 90 in FIG. 6) using the cumulative projection image, the profile for an initial search window 92 that is likely to contain the boundary is examined to find the first pixel whose sum exceeds a predetermined threshold, starting from the pixel closest to the center of the boundary image and moving toward the outer boundary. (The pixels in the profile are examined starting from the pixel closest to the center in order to distinguish the coverslip boundary from the slide boundary, since both boundaries are likely to appear in the image and the coverslip boundary lies closer to the center than the slide boundary.) A weighted average is then calculated over the sums for that pixel and each pixel following it whose sum exceeds the threshold, until a pixel having a sum less than the threshold is met. If the threshold is 2.5, the weighted average is calculated over the seventh, sixth, and fifth pixels from the outer end of the profile, which have sums of 3, 4, and 5, respectively. The weighted average gives an estimated point 94 on the coverslip boundary 90 at the center of the initial search window 92. The first detected boundary point thus yields an initial assumed horizontal edge 96.
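The threshold scan and weighted average just described might be implemented as in the following sketch; the function name and the center-outward ordering of the profile are assumptions made for illustration:

```python
def estimate_boundary_point(profile, threshold=2.5):
    # profile[0] is the pixel closest to the image center.  Scan outward
    # for the first sum above the threshold, then average the positions of
    # the following run of above-threshold sums, weighted by those sums.
    i = 0
    while i < len(profile) and profile[i] <= threshold:
        i += 1
    if i == len(profile):
        return None  # no boundary evidence in this window
    num = den = 0.0
    while i < len(profile) and profile[i] > threshold:
        num += i * profile[i]
        den += profile[i]
        i += 1
    return num / den

# The profile from the example, ordered from the image center outward:
print(estimate_boundary_point([0, 3, 4, 5, 2, 0, 4, 2]))  # ~2.17
```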
Next, the cumulative projection image is divided into two search windows 98 and 100, i.e., sub-images, and the center point 94 detected in the previous calculation is used as the starting point for finding a new weighted average and a new estimated boundary point 102, 104 in each sub-image 98, 100 made by dividing the previous window 92. These subimages 98 and 100 are bisected, and the bisected images are subdivided again, in a repeating fashion, until each image contains a relatively small number of pixels representing a relatively small distance in the raw boundary image. A boundary detected in this way can follow the actual boundary 90 relatively accurately, including irregularities in the actual boundary. As mentioned, one of the advantages of using a cumulative projection image is that the profile of each subimage need not be obtained by summing over the subimage anew; it can be calculated by subtracting the contrasts of the pixels of the cumulative projection image corresponding to the leftmost column of the subimage from the contrasts of the pixels corresponding to the rightmost column of the subimage. To fill the gaps between the calculated coverslip boundary points, a least squares fit or similar curve-fitting function may be used. It is also preferable to verify that the detected boundary actually is the coverslip boundary, for example by summing the portions of the profiles adjacent to the coverslip boundary calculated for each of the last series of subimages and checking whether the sum exceeds a threshold contrast.
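A sketch of the subdivision step, reusing estimate_boundary_point from the previous sketch; the window bounds, stopping width, and fixed threshold are illustrative assumptions (in practice the threshold would presumably be scaled with the window width):

```python
def window_profile(cumulative, lo, hi):
    # Profile of the sub-window spanning columns [lo, hi): the running sums
    # at its right edge minus those just left of its left edge, so no
    # re-summation over the sub-window is needed.
    left = cumulative[:, lo - 1] if lo > 0 else 0
    return cumulative[:, hi - 1] - left

def trace_boundary(cumulative, lo, hi, threshold, min_width=4):
    # Recursively bisect the window, estimating one boundary point per
    # sub-window; the profile is reversed into center-outward order, so
    # the returned row coordinate is counted from the image center.
    if hi - lo < min_width:
        return []
    row = estimate_boundary_point(window_profile(cumulative, lo, hi)[::-1], threshold)
    if row is None:
        return []
    mid = (lo + hi) // 2
    return (trace_boundary(cumulative, lo, mid, threshold, min_width)
            + [((lo + hi) / 2.0, row)]
            + trace_boundary(cumulative, mid, hi, threshold, min_width))
```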
While the preprocessing operations for removing short or curved lines in the boundary image can include certain well-known conventional morphological filtering operations, it is desirable, in order to obtain an effect similar to erosion followed by dilation, to perform a one-dimensional morphological opening operation, i.e., a cumulative opening operation, on the cumulative projection image prior to applying the dividing projection algorithm described above. For horizontal boundaries, the operation makes two passes across the rows of the cumulative projection image. The first pass scans from right to left, converting the cumulative sums of the pixels corresponding to boundary points that are to be removed from the image from positive to negative. As an example, consider the rows illustrated below for a raw image and its cumulative projection. The pixels of the cumulative projection row to be removed are those for which the sum of the n raw-image pixels ending at that pixel is less than a predetermined threshold. For example, say n is 4 and the threshold is 4 as well. The sum of the n pixels ending at each pixel in the row is determined, going from right to left, by subtracting the value of the pixel n positions to the left of the current pixel in the cumulative projection row from the value of the current pixel. This has the effect of an erosion of n pixels, in this case 4 pixels. A dilation of n pixels, in this case 4 pixels, is obtained by setting a counter to n whenever the sum of the n pixels is greater than or equal to the threshold and decrementing the counter at each step otherwise, producing the counter row below. Where the counter row value is less than or equal to 0, the values of the corresponding pixels in the cumulative projection row are converted from positive to negative, producing the right-to-left pass row below.
Raw row:
1 1 1 0 1 1 1 1 1 0 0 1
Projection row:
1 2 3 3 4 5 6 7 8 8 8 9
Counter row:
-3 -2 -1 0 1 2 3 4 4 -2 1 0
Right-to-left pass row:
-1 -2 -3 -3 4 5 6 7 8 -8 8 -9
The second pass runs from left to right, computing the new cumulative sums while carrying an error term that corrects the original cumulative sums. If the corresponding pixel in the right-to-left pass row is positive, the new output pixel for the left-to-right pass is equal to that pixel plus the error. If the corresponding pixel in the right-to-left pass row is negative, the new output pixel for the left-to-right pass is the same as the preceding output pixel, and the current pixel in the error row is updated to the preceding output pixel plus the current (negative) pixel of the right-to-left pass row. For example, since the leftmost pixel in the right-to-left pass row is "-1" and there is no preceding output pixel, the leftmost pixel in the left-to-right pass row is "0", and the leftmost pixel in the error row is the preceding output pixel ("0") plus the current right-to-left pixel ("-1"), i.e., "-1". The second pixel in the left-to-right pass row is again "0", because the corresponding pixel in the right-to-left pass row is negative ("-2") and the preceding output pixel is "0"; the second pixel in the error row becomes "-2", the preceding output pixel ("0") plus the current right-to-left pixel ("-2"). At the fifth pixel from the left, the first whose right-to-left pass value is positive ("4"), the error is not updated and remains "-3", and the output is the sum of "4" and the error "-3", i.e., "1". The rest of the row is calculated according to these examples.
Error row:
-1 -2 -3 -3 -3 -3 -3 -3 -3 -3 -3 -4
Left-to-right pass row:
0 0 0 0 1 2 3 4 5 5 5 5
The values in the cumulative projection image are then replaced by the values in the corresponding left-to-right pass rows, and the boundary is detected in the image as described above.
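The two passes can be combined into one routine, as in the sketch below, which is reconstructed from the worked example (intermediate sign markings may differ slightly from the printed rows, but the final opened row matches):

```python
def cumulative_opening(proj, n=4, threshold=4):
    # Pass 1 (right to left): negate entries whose pixels are to be
    # removed -- an n-pixel erosion (windowed sums taken directly from the
    # cumulative row) followed by an n-pixel dilation (countdown counter).
    marked = list(proj)
    counter = 0
    for i in range(len(proj) - 1, -1, -1):
        nsum = proj[i] - (proj[i - n] if i >= n else 0)
        counter = n if nsum >= threshold else counter - 1
        if counter <= 0:
            marked[i] = -marked[i]
    # Pass 2 (left to right): rebuild the cumulative sums, carrying an
    # error term so that removed pixels contribute nothing.
    out, error, prev = [], 0, 0
    for m in marked:
        if m > 0:
            prev = m + error
        else:
            error = prev + m  # m is negative: record the correction
        out.append(prev)
    return out

row = [1, 2, 3, 3, 4, 5, 6, 7, 8, 8, 8, 9]  # projection row from the example
print(cumulative_opening(row))              # [0, 0, 0, 0, 1, 2, 3, 4, 5, 5, 5, 5]
```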
A coverslip mask is then generated based on the detected coverslip boundaries; the mask distinguishes the area within the coverslip, bounded by the four detected coverslip boundaries, from the area outside the coverslip.
The processing system 26 then forms a bubble mask based on the raw bubble image as well as the boundary lines detected in both the bubble image and the boundary image. First, the boundary image is thresholded, and boundary lines are detected in the thresholded image to produce the boundaries associated with bubbles and the coverslip, including those of oxidized cells, which are also visible in the boundary image. The areas of these oxidized cells, also known as cornflakes, can be isolated, recorded, and provided to the classification system.
Before the bubble image is analyzed by the processing system 26, artifacts in the bubble image are removed by subtracting a reference image from the raw bubble image. The reference image is obtained by capturing an image of an empty slide without a coverslip, using the bubble illumination technique used to obtain the raw bubble image, including the use of the diffuser 24 as described above. An offset is added to the bubble image before subtracting the reference image to ensure that the resulting image contains only positive pixel contrast values. The boundaries in the resulting bubble image are detected using conventional morphological boundary detectors and thresholding operations. These boundary lines are combined with the boundary lines detected in the boundary image to create a combined boundary image. Since this image is likely to contain small gaps in the boundaries, a dilation operation is used to expand the boundaries in all directions and close the gaps. The combined boundary image now includes a number of contiguous, or connected, regions defined by the connected boundaries of the image, and these regions can be analyzed to determine whether they represent bubbles or sample material.
To distinguish the connected regions representing bubbles from the connected regions representing sample material, a mean gray scale contrast is determined for each of the connected regions using a histogram. Based on whether the mean value for each connected region exceeds one of two thresholds, it is determined whether the region contains bubbles or sample material. The threshold applied to a particular connected region is determined by the brightness of the same region in the raw boundary image. Bubbles containing oxidized cells are not as bright in the bubble image but appear bright in the raw boundary image, so a relatively low threshold is applied to the connected regions in the bubble image that correspond to bright areas in the raw boundary image when determining whether those regions are bubbles. For connected regions that appear dark in the raw boundary image, a relatively high threshold is applied to distinguish whether they correspond to bubbles or to sample material. Regions whose mean values exceed the applied threshold are determined to represent bubbles and form the bubble mask.
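A sketch of the dual-threshold test; the connected-region labeling is assumed to be given (e.g., by a standard labeling pass), and all threshold values are illustrative assumptions:

```python
import numpy as np

def bubble_mask(labels, bubble_img, border_img,
                low_thresh=120, high_thresh=180, bright_border=160):
    mask = np.zeros(labels.shape, dtype=bool)
    for label in np.unique(labels):
        if label == 0:
            continue                             # 0 = background/boundary pixels
        region = labels == label
        mean_bubble = bubble_img[region].mean()  # mean gray level in bubble image
        mean_border = border_img[region].mean()  # brightness in boundary image
        # Bright in the boundary image (e.g., oxidized cells): low threshold;
        # dark in the boundary image: high threshold.
        thresh = low_thresh if mean_border > bright_border else high_thresh
        if mean_bubble > thresh:
            mask[region] = True                  # region judged to be a bubble
    return mask
```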
By logically combining the coverslip mask and the bubble mask, a boundary map of the regions of interest of the slide, i.e., the regions containing sample material within the boundaries of the coverslip and outside any bubbles, is obtained.
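With both masks held as boolean arrays of the image shape, this combination is a single logical operation (a sketch under the same assumptions as above):

```python
# Inside the coverslip and not obscured by a bubble.
region_map = coverslip_mask & ~bubble_mask
```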
In some cases, it may also be desirable to provide the operator with an indication of the processing system's confidence that it has generated an accurate boundary map, to help the operator review only certain boundary maps for accuracy. Confidence in the accuracy of the boundary map can be estimated using various measures, including the detection of bright bubble areas outside the coverslip boundary, the translational error between the detected and expected slide positions, the rotational error between the detected and expected slide orientations, the parallelism error of the detected coverslip boundaries, whether bright areas not included in the bubbles detected in the image were found, the difference between the detected and expected slide background, and the total bubble area detected.
For the operator to review the boundary map, the processing system 26 generates a mapped image for display on the monitor 34. The mapped image is an image of the slide 16 obtained by combining the bubble image and the boundary image and overlaying the combined image with a transparent cover in which colored areas, for example green areas, indicate the areas excluded from the map sent to the classification system 12. The cover is generated by assigning the pixels corresponding to sample material in the combined image one identifier, such as making the gray scale contrasts of those pixels all odd or all even, and assigning the pixels corresponding to regions excluded from the boundary map a different identifier, such as making the gray scale contrasts of those pixels the parity not assigned to the sample-material pixels. For example, the contrast of each pixel in the combined image corresponding to an area excluded from the map may be assigned the nearest even number, and the contrast of each pixel corresponding to an area of sample material in the map may be assigned the nearest odd number. The contrast of each pixel that must be changed is preferably changed by only one, to preserve the integrity of the contrast value for that pixel and thus the integrity of the entire image. Pixels that already have the correct parity are, of course, not changed. As an example, consider two pixels in the combined image: a first pixel corresponding to a bubble, with a gray scale contrast of 199, and a second pixel corresponding to sample material, with a gray scale contrast of 150. The gray scale contrast of the first pixel is changed to 200 to indicate that the pixel corresponds to an area excluded from the boundary map, and the gray scale contrast of the second pixel is changed to 151 to indicate that it corresponds to an area included in the map.
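A sketch of the parity encoding, assuming an 8-bit combined image and a boolean mask of the excluded areas (the function name is ours):

```python
import numpy as np

def encode_overlay(combined, excluded):
    # Excluded areas get even gray levels and sample areas odd gray levels,
    # each pixel changing by at most one so the image still looks right.
    out = combined.astype(np.int16)
    odd = (out % 2) == 1
    out[excluded & odd] += 1      # e.g. a bubble pixel at 199 becomes 200
    out[~excluded & ~odd] += 1    # e.g. a sample pixel at 150 becomes 151
    out[out > 255] = 254          # keep the 8-bit range (255 would overflow)
    return out.astype(np.uint8)
```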
Next, a lookup table is used to determine the red, green, and blue contrasts for each pixel on the display device. The lookup table contains red, green, and blue contrasts for each of the possible gray scale pixel contrasts, for example 256 gray scale contrasts. In the lookup table, the odd gray scale contrasts are assigned equal red, green, and blue contrasts corresponding to the input gray scale contrast. The even gray scale contrasts are assigned red and blue contrasts of zero and a green contrast corresponding to the input gray scale contrast. Thus, for example, for a pixel that corresponds to sample material and has an odd gray scale input of 151, the lookup table provides red, green, and blue outputs of 151 each. For a pixel that corresponds to an area excluded from the boundary map, such as a bubble, and has an even gray scale input of 200, the lookup table provides red and blue output contrasts of 0 and a green contrast of 200.
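The lookup table itself is small enough to build directly; this sketch follows the odd-gray/even-green scheme just described:

```python
import numpy as np

levels = np.arange(256, dtype=np.uint16)
gray_if_odd = np.where(levels % 2 == 1, levels, 0).astype(np.uint8)
lut = np.stack([gray_if_odd,              # red:   the level if odd, else 0
                levels.astype(np.uint8),  # green: always the input level
                gray_if_odd],             # blue:  the level if odd, else 0
               axis=1)

# lut[151] -> (151, 151, 151), gray; lut[200] -> (0, 200, 0), green.
rgb = lut[encoded]  # 'encoded' is the parity-encoded image from the previous sketch
```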
Thus, the areas of the mapped image that are included in the boundary map transmitted to the classification system 12 appear in black and white on the display device 34, while areas excluded from the map, such as areas containing bubbles and areas located outside the coverslip, appear in green. Because making a contrast odd or even requires changing the gray scale contrast of a pixel by at most one, the relative contrasts of the pixels of the mapped image are substantially maintained, so the operator can observe the image, reliably judge the accuracy of the boundary map, and modify the map if necessary. Modification of the boundary map may be performed using a light pen, mouse, or other suitable interface through which the operator can instruct the processing system 26 which areas to include in or exclude from the map.
Claims:
Claims (49)
[1" claim-type="Currently amended] Selectively irradiating the slide with a first light source oriented obliquely to the surface of the slide;
Obtaining a first image of the slide irradiated by the first light source;
Selectively irradiating the slide with a second light source that provides normally distributed light;
Obtaining a second image of the slide irradiated by the second light source; And
Generating a map of regions of interest based on the first image and the second image.
[2" claim-type="Currently amended] The method of claim 1,
Irradiating the slide with the first light source,
Examining the plurality of sides of the slide.
[3" claim-type="Currently amended] The method of claim 1,
Irradiating the slide with the second light source,
Positioning the disperser between the second light source and the slide when the second light source irradiates the slide, and positioning the disperser at a different position when the first light source irradiates the slide. How to map regions of a slide.
[4" claim-type="Currently amended] A first light source usually obliquely oriented to the surface of the slide to produce a first image;
A second light source providing light normally distributed to the surface of the slide to produce a second image;
A camera for obtaining the first image and the second image; And
And a processor for generating a map of important areas based on the first image and the second image.
[5" claim-type="Currently amended] The method of claim 4, wherein
And the second light source comprises a diffuser for dispersing light.
[6" claim-type="Currently amended] The method of claim 4, wherein
And the first light source is directed such that light incident on the slide is incident from four sides of the slide.
[7" claim-type="Currently amended] The method of claim 4, wherein
Light from the second light source is directed through the diffuser to disperse light.
[8" claim-type="Currently amended] The method of claim 7, wherein
And the disperser may be selectively positioned between a position within the field of view of the camera and a position outside the field of view of the camera.
[9" claim-type="Currently amended] camera;
A disperser that can be selectively positioned at a first position within the field of view of the camera and a second position outside the field of view of the camera,
The camera obtains a first image when the disperser is in the first position and obtains a second image when the disperser is in the second position; And
And a processor for generating a map of important areas based on the first image and the second image.
[10" claim-type="Currently amended] The method of claim 9,
Slide mapping system, characterized in that it comprises a first light source usually oriented obliquely to the surface of the slide.
[11" claim-type="Currently amended] The method of claim 9,
And a second light source directed to direct light through the spreader to the slide when the spreader is in the first position.
[12" claim-type="Currently amended] A first light source oriented obliquely to the surface to produce a first image;
A second light source for providing light to the surface;
A diffuser for dispersing light from the second light source reflected by the surface to make a second image;
A camera for obtaining the first image and the second image; And
And a processor for generating a map of regions of interest based on the first image and the second image.
[13" claim-type="Currently amended] The method of claim 12,
And the disperser may be selectively positioned between a position within the field of view of the camera and a position outside the field of view of the camera.
[14" claim-type="Currently amended] Generating a pixel contrast map for the sample;
Determining locations of interest in the sample;
Assigning either odd or even to pixels in the locations of interest,
The number assigned to each pixel represents the contrast of that pixel;
Assigning the other of the odd or even to pixels other than the pixels of interest positions,
The number assigned to each pixel represents the contrast of that pixel; And
Displaying the pixels;
And the odd numbered pixels are displayed with different color characteristics from the even numbered pixels.
[15" claim-type="Currently amended] The method of claim 14,
And pixels in the positions of interest are displayed in black and white.
[16" claim-type="Currently amended] The method of claim 14,
And pixels other than the positions of interest are displayed in a specific color.
[17" claim-type="Currently amended] The method of claim 14,
Odd number is assigned to pixels in the locations of interest.
[18" claim-type="Currently amended] The method of claim 14,
And an even number is assigned to pixels other than the positions of interest.
[19" claim-type="Currently amended] The method of claim 14,
And a number assigned to one pixel is different from the gray scale contrast of the pixel and visually meaningless when the pixel is displayed.
[20" claim-type="Currently amended] The method of claim 17,
And a number assigned to one pixel is different from the gray scale contrast of the pixel and visually meaningless when the pixel is displayed.
[21" claim-type="Currently amended] The method of claim 14,
And allowing the user to modify the pixel contrast map.
[22" claim-type="Currently amended] The method of claim 14,
Generating the pixel contrast map,
Selectively irradiating the slide comprising the specimen with a first light source usually oriented obliquely to the surface of the slide;
Obtaining a first image of the slide irradiated by the first light source;
Selectively irradiating the slide with the second light source;
Obtaining the second image irradiated by the second light source; And
And generating a map of important areas based on the first image and the second image.
[23" claim-type="Currently amended] Generating a pixel contrast map of the sample;
Determining locations of interest in the sample;
Assigning either odd or even to pixels in the locations of interest,
The number assigned to each pixel represents the contrast of that pixel;
Assigning the other of the odd or even to pixels other than the positions of interest,
The number assigned to each pixel represents the contrast of that pixel;
Displaying the pixels;
The odd-numbered pixels are displayed with different color characteristics than the even-numbered pixels; And
And allowing an operator to change the pixel contrast map.
[24" claim-type="Currently amended] The method of claim 23, wherein
And the pixels in the locations of interest are displayed in black and white.
[25" claim-type="Currently amended] The method of claim 23, wherein
And pixels other than the positions of interest are displayed in a specific color.
[26" claim-type="Currently amended] The method of claim 23, wherein
And an odd number is assigned to the pixels in the positions of interest.
[27" claim-type="Currently amended] The method of claim 23, wherein
And an even number is assigned to pixels other than the positions of interest.
[28" claim-type="Currently amended] The method of claim 23, wherein
And the number assigned to one pixel is visually insignificantly different from the gray scale contrast of the pixel when the pixel is displayed.
[29" claim-type="Currently amended] The method of claim 26,
And the number assigned to one pixel is visually insignificantly different from the gray scale contrast of the pixel when the pixel is displayed.
[30" claim-type="Currently amended] Obtaining a first image of the slide irradiated under a first irradiation condition;
Obtaining a second image of the slide irradiated under a second irradiation condition;
Finding boundary lines in the first image and the second image and combining the boundary lines to form a third image;
Detecting bounded regions defined by the boundary lines in the third image;
Calculating a gray scale contrast average value for each region in the second image corresponding to the bounded region in the third image; And
And comparing the average values calculated for the respective areas with a threshold based on the gray scale contrast of the corresponding area in the first image.
[31" claim-type="Currently amended] The method of claim 30,
And said first irradiation condition comprises irradiating light incident obliquely on said slide.
[32" claim-type="Currently amended] The method of claim 30,
And said second irradiation condition comprises irradiating light scattered on said slide.
[33" claim-type="Currently amended] The method of claim 30,
And connecting the gaps in the boundary line of the third image.
[34" claim-type="Currently amended] The method of claim 30,
Wherein said regions with calculated average values exceeding an associated threshold in said second image are determined to represent a bubble.
[35" claim-type="Currently amended] The method of claim 30,
Regions in the second image that correspond to regions in the first image having a relatively high gray scale contrast are in the second image that correspond to regions of the first image having the relatively low gray scale contrast A method for detecting the position of a bubble in a slide, characterized in that it is compared to a threshold lower than areas.
[36" claim-type="Currently amended] A method for detecting lines in an image formed of multiple rows and columns of pixels,
Summing the contrast values for the plurality of pixels in one row with the contrast values for the preceding pixels in the row to store a sum for each of the plurality of pixels;
Comparing the stored values for the plurality of pixels in the same column with a threshold; And
Estimating a point on the line as a function of pixels having stored sums exceeding the threshold.
[37" claim-type="Currently amended] The method of claim 36,
And the estimating step comprises performing a weighted average of the pixels and the stored sums.
[38" claim-type="Currently amended] The method of claim 36,
Using the estimated point to estimate the position of another point on the line.
[39" claim-type="Currently amended] The method of claim 36,
Dividing the image into a plurality of sub-images and using the estimated points to estimate additional positions on the line.
[40" claim-type="Currently amended] The method of claim 39,
For the plurality of rows adjacent to the estimated point, obtaining sums of contrast values for the plurality of pixels in one row in each subimage, and comparing the sums with a threshold value Line detection method in an image to be described.
[41" claim-type="Currently amended] The method of claim 40,
Obtaining the sums comprises subtracting a stored sum for a pixel in one terminal column of the subimage from a stored sum in the other terminal column of the subimage.
[42" claim-type="Currently amended] A method for detecting lines in an image formed of multiple rows and columns of pixels,
Summing the contrast values for the plurality of pixels in one row with the contrast values for the preceding pixels in the row to store a sum for each of the plurality of pixels;
Comparing the stored values for the plurality of pixels in the same column with a threshold;
Estimating a first point on the line as a function of pixels with stored sums exceeding the threshold;
Dividing the image into a plurality of subimages;
Obtaining, for a plurality of rows adjacent to the estimated point, sums of contrast values for the plurality of pixels in one row in each of the subimages;
Comparing the obtained sums with the threshold; And
Estimating a position of additional points on the line as a function of pixels having the obtained sums exceeding the threshold.
[43" claim-type="Currently amended] The method of claim 42,
Obtaining the sums comprises subtracting a stored sum for a pixel in one terminal column of the subimage from a stored sum in the other terminal column of the subimage.
[44" claim-type="Currently amended] The method of claim 42,
Estimating the position of the additional points comprises performing a weighted average of the pixels and the sums obtained.
[45" claim-type="Currently amended] Displaying a first map of the image with significant regions distinct from other regions in the image;
Allowing an operator to change important areas in the display; And
And generating a second map in accordance with any change made by the first map and an operator.
[46" claim-type="Currently amended] The method of claim 45,
And the operator can make the change using a mouse.
[47" claim-type="Currently amended] The method of claim 45,
And said operator can make said change using a light pen.
[48" claim-type="Currently amended] The method of claim 5,
And the disperser may be selectively positioned within the field of view of the camera.
[49" claim-type="Currently amended] The method of claim 4, wherein
And the first light source is directed such that light incident on the slide is incident from a plurality of sides of the slide.
Patent Family:
Publication number | Publication date
EP0868705A4|1999-04-14|
EP0868705B1|2003-06-11|
EP0868705A1|1998-10-07|
WO1997022946A1|1997-06-26|
AU1334697A|1997-07-14|
NO982821D0|1998-06-18|
AU720182B2|2000-05-25|
US5835620A|1998-11-10|
JP2000508095A|2000-06-27|
AT242898T|2003-06-15|
NO982821L|1998-08-14|
CA2240948A1|1997-06-26|
DE69628661D1|2003-07-17|
CN1208486A|1999-02-17|
BR9612194A|1999-12-28|
Legal Status:
1995-12-19|Priority to US08/576,988
1995-12-19|Priority to US8/576,988
1996-12-13|Application filed by Mark R. Rutenberg, Neuromedical Systems, Incorporated
1996-12-13|Priority to PCT/US1996/019987
2000-11-06|Publication of KR20000064473A
Priority:
Application number | Filing date | Patent title
US08/576,988|US5835620A|1995-12-19|1995-12-19|Boundary mapping system and method|
US8/576,988|1995-12-19|
PCT/US1996/019987|WO1997022946A1|1995-12-19|1996-12-13|Boundary mapping system and method|